Bounds on general entropy measures
Authors
Abstract
We show how to determine the maximum and minimum possible values of one measure of entropy for a given value of another measure of entropy. These maximum and minimum values are obtained for two standard forms of probability distribution (or quantum state) independent of the entropy measures, provided the entropy measures satisfy a concavity/convexity relation. These results may be applied to entropies for classical probability distributions, entropies of mixed quantum states and measures of entanglement for pure states.

PACS numbers: 03.67.–a

‡ Present address: Quantum Information Science Group, Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4, Canada.
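As an illustration of comparing one entropy measure against another on the same distribution, here is a minimal sketch in Python. The function names and the example distribution are ours for illustration; the paper's specific extremal distribution families are not reproduced here. The Rényi family is monotonically non-increasing in its order α, so the collision entropy (α = 2) lower-bounds the Shannon entropy (α → 1), which in turn is bounded above by lower-order Rényi entropies.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, with 0 log 0 = 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha)."""
    if alpha == 1:
        return shannon_entropy(p)  # Shannon is the alpha -> 1 limit
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

# Compare the measures on a skewed three-outcome distribution.
p = [0.7, 0.2, 0.1]
print(shannon_entropy(p))      # ~1.157 bits
print(renyi_entropy(p, 2))     # ~0.889 bits (collision entropy, smaller)
print(renyi_entropy(p, 0.5))   # ~1.356 bits (larger than Shannon)
```

For the uniform distribution all Rényi entropies coincide at log2 of the alphabet size, which is why extremal questions of the kind the paper studies only become nontrivial away from that point.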
Similar resources
Bounds on general entropy measures (arXiv preprint 0305059v3, 5 Jan 2004)
We show how to determine the maximum and minimum possible values of one measure of entropy for a given value of another measure of entropy. These maximum and minimum values are obtained for two standard forms of probability distribution (or quantum state) independent of the entropy measures, provided the entropy measures satisfy a concavity/convexity relation. These results may be applied to en...
Full text

Equivocations, Exponents and Second-Order Coding Rates under Various Rényi Information Measures
In this paper, we evaluate the asymptotics of equivocations, their exponents as well as their second-order coding rates under various Rényi information measures. Specifically, we consider the effect of applying a hash function on a source and we quantify the level of non-uniformity and dependence of the compressed source from another correlated source when the number of copies of the sources is...
Full text

Some properties of the parametric relative operator entropy
The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...
Full text

Generalized Source Coding Theorems and Hypothesis Testing: Part I – Information
Expressions for ε-entropy rate, ε-mutual information rate and ε-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation...
Full text

Upper Bounds on Fourier Entropy
Given a function f : {0,1}^n → R, its Fourier entropy is defined to be −∑_S f̂(S)² log f̂(S)², where f̂ denotes the Fourier transform of f. This quantity arises in a number of applications, especially in the study of Boolean functions. An outstanding open question is a conjecture of Friedgut and Kalai (1996), called the Fourier Entropy Influence (FEI) Conjecture, asserting that the Fourier Entropy of any ...
Full text
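To make the Fourier entropy of the listing above concrete, here is a small brute-force sketch in Python (the helper names are ours). It computes the Fourier coefficients of a ±1-valued Boolean function by direct summation and evaluates −∑_S f̂(S)² log₂ f̂(S)². For the two-bit AND, all four coefficients equal ±1/2, giving entropy −4·(1/4)·log₂(1/4) = 2 bits; a parity function concentrates all Fourier weight on a single coefficient, giving entropy 0.

```python
import math
from itertools import product

def fourier_coefficients(f, n):
    """f maps tuples in {0,1}^n to +/-1. Returns {S: f_hat(S)} with S a 0/1 tuple."""
    coeffs = {}
    for S in product([0, 1], repeat=n):
        total = 0
        for x in product([0, 1], repeat=n):
            # Character chi_S(x) = (-1)^(sum of x_i over i in S)
            chi = (-1) ** sum(xi * si for xi, si in zip(x, S))
            total += f(x) * chi
        coeffs[S] = total / 2 ** n
    return coeffs

def fourier_entropy(f, n):
    """-sum over S of f_hat(S)^2 log2 f_hat(S)^2, skipping zero coefficients."""
    return -sum(c * c * math.log2(c * c)
                for c in fourier_coefficients(f, n).values() if c != 0)

# AND of two bits in the +/-1 convention: entropy 2 bits.
and2 = lambda x: 1 if x == (1, 1) else -1
print(fourier_entropy(and2, 2))  # 2.0
# Parity of two bits: all weight on one coefficient, entropy 0.
parity2 = lambda x: (-1) ** (x[0] + x[1])
```

The exponential-time double loop is fine for toy sizes; a fast Walsh–Hadamard transform would be the idiomatic choice for larger n.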